
Quiz exercises: Fix an issue with drag and drop positioning #8265

Merged
merged 4 commits from hotfix/fix-quiz-exercises into develop on Mar 28, 2024

Conversation

@matthiaslehnertum (Contributor) commented Mar 27, 2024

This PR hotfixes the positioning of elements when dragging and dropping in quiz exercises.

Checklist

General

  • I tested all changes and their related features with all corresponding user types on a test server.
  • This is a small issue that I tested locally and was confirmed by another developer on a test server.
  • I chose a title conforming to the naming conventions for pull requests.

Client

  • Important: I implemented the changes with a very good performance, prevented too many (unnecessary) REST calls and made sure the UI is responsive, even with large data.
  • I strictly followed the client coding and design guidelines.

Motivation and Context

At the moment, elements dragged in quiz exercises are positioned incorrectly. This PR fixes that behavior.

Description

The computeDropLocation function in quiz-exercise-generator.ts wrongly assumes that all SVGs start at {x: 0, y: 0}, which is not always the case. It calculates drop-location positions as a percentage of the total width and height, but does not offset them by the starting position of the SVG clip. This PR fixes the issue by offsetting the calculation by the totalSize's x and y coordinates.
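
For illustration, the corrected logic looks roughly like the sketch below. The `Bounds` and `DropLocation` shapes and the plain 0–100 percentage scale are simplifying assumptions for this example rather than the verbatim Artemis code; the essential change is that the element's position is measured relative to `totalSize.x` / `totalSize.y` instead of assuming the SVG clip starts at the origin.

```typescript
// Illustrative sketch only – the types and the percentage scale are assumptions,
// not the exact Artemis implementation.
interface Bounds {
    x: number;
    y: number;
    width: number;
    height: number;
}

interface DropLocation {
    posX: number; // percentage of the total width
    posY: number; // percentage of the total height
    width: number;
    height: number;
}

function computeDropLocation(elementLocation: Bounds, totalSize: Bounds): DropLocation {
    return {
        // Before the fix, the numerator was effectively just elementLocation.x,
        // which is only correct when the SVG clip starts at {x: 0, y: 0}.
        // Subtracting the clip origin makes the percentage relative to the visible area.
        posX: ((elementLocation.x - totalSize.x) / totalSize.width) * 100,
        posY: ((elementLocation.y - totalSize.y) / totalSize.height) * 100,
        width: (elementLocation.width / totalSize.width) * 100,
        height: (elementLocation.height / totalSize.height) * 100,
    };
}
```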

Steps for Testing

Prerequisites:

  • 1 Student
  • 1 Quiz exercise
  1. Log in to Artemis
  2. Navigate to a Quiz exercise
  3. Move answering options via drag-and-drop
  4. The answering options should now be correctly placed relative to the mouse cursor

Exam Mode Testing

Prerequisites:

  • 1 Student
  • 1 Exam with a Quiz Exercise
  1. Log in to Artemis
  2. Participate in the exam as a student
  3. Move answering options via drag-and-drop
  4. The answering options should now be correctly placed relative to the mouse cursor

Testserver States

Note

These badges show the state of the test servers.
Green = Currently available, Red = Currently locked







Review Progress

Performance Review

  • I (as a reviewer) confirm that the client changes (in particular related to REST calls and UI responsiveness) are implemented with a very good performance
  • I (as a reviewer) confirm that the server changes (in particular related to database calls) are implemented with a very good performance

Code Review

  • Code Review 1
  • Code Review 2

Manual Tests

  • Test 1
  • Test 2

Exam Mode Test

  • Test 1
  • Test 2

Test Coverage

Screenshots

Summary by CodeRabbit

  • Refactor
    • Enhanced element positioning calculation in quiz exercise generation for improved accuracy.

@matthiaslehnertum matthiaslehnertum requested a review from a team as a code owner March 27, 2024 21:39

coderabbitai bot commented Mar 27, 2024

Walkthrough

The recent update refines the computeDropLocation function in quiz-exercise-generator.ts within the quiz management module. The totalSize.x and totalSize.y offsets of the container are now taken into account when computing element positions (posX and posY), so placement is relative to the container's actual origin. This ensures precise element placement within the quiz exercise diagrams.
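
As a hedged numeric illustration of the difference the offset makes (the concrete numbers below are made up for this example and are not taken from the PR or its tests):

```typescript
// An element dropped at x = 30 inside an SVG whose clip starts at x = 20 (width 100):
// old formula: 30 / 100        -> 30 %, drawn too far to the right
// new formula: (30 - 20) / 100 -> 10 %, matching where the user actually dropped it
const totalSize = { x: 20, y: 10, width: 100, height: 50 };
const element = { x: 30, y: 35, width: 10, height: 5 };

const posX = ((element.x - totalSize.x) / totalSize.width) * 100;  // 10
const posY = ((element.y - totalSize.y) / totalSize.height) * 100; // 50

console.assert(posX === 10 && posY === 50, 'offset-corrected percentages');
```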

Changes

File Change Summary
.../quiz/manage/apollon-diagrams/exercise-generation/quiz-exercise-generator.ts — Updated the computeDropLocation function to apply the totalSize.x and totalSize.y offsets for accurate element positioning.

Possibly related issues

  • Exam: Divide the Testing Process into Steps #6621: The changes in this update align with the objective of refining the testing process during exam programming exercise creation by ensuring accurate element positioning, which is crucial for maintaining the integrity of the testing environment.


@github-actions github-actions bot added the client Pull requests that update TypeScript code. (Added Automatically!) label Mar 27, 2024

⚠️ Unable to deploy to test servers ⚠️

Testserver "artemis-test3.artemis.cit.tum.de" is already in use by PR #8263.

@github-actions github-actions bot added deployment-error Added by deployment workflows if an error occurred and removed deployment-error Added by deployment workflows if an error occurred labels Mar 27, 2024
@julian-christl julian-christl temporarily deployed to artemis-test2.artemis.cit.tum.de March 27, 2024 22:00 — with GitHub Actions Inactive
@krusche krusche changed the title Quiz Exercises: Rectify drag-and-drop positioning Quiz exercises: Fix an issue with drag and drop positioning Mar 28, 2024
@krusche krusche added this to the 6.9.5 milestone Mar 28, 2024
@krusche krusche merged commit 0c83f1f into develop Mar 28, 2024
25 of 33 checks passed
@krusche krusche deleted the hotfix/fix-quiz-exercises branch March 28, 2024 22:58
Labels
client Pull requests that update TypeScript code. (Added Automatically!)
Projects
Archived in project
3 participants